Local tuning and partition strategies for diagonal GO methods
Authors
Abstract
In this paper, global optimization (GO) Lipschitz problems are considered where the multi-dimensional multiextremal objective function is defined over a hyperinterval. An efficient one-dimensional GO method using local tuning on the behavior of the objective function is generalized to the multi-dimensional case by the diagonal approach using two partition strategies. Global convergence conditions are established for the obtained diagonal geometric methods. Results of a wide numerical comparison show that the new methods, which work with estimates of the local Lipschitz constants over different subregions of the search domain, achieve a strong acceleration with respect to the traditional approach.
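The core idea behind local tuning can be illustrated in the one-dimensional case: instead of bounding the objective with a single global Lipschitz constant, each subinterval receives its own estimate built from the slopes of nearby trial points, balanced against a scaled global estimate. The following is a minimal sketch of such a Piyavskii-type scheme; the function name, the parameters `r` and `xi`, and the test objective are illustrative assumptions, not the paper's exact algorithm.

```python
import bisect

def local_tuning_minimize(f, a, b, n_trials=50, r=1.3, xi=1e-8):
    """Sketch of 1D Lipschitz minimization with locally tuned constants.

    Each subinterval [x_i, x_{i+1}] gets a Lipschitz estimate L_i mixing
    slopes of neighboring trials (local information) with the global
    slope estimate scaled by the relative interval width.
    """
    xs = [a, b]            # sorted trial points
    fs = [f(a), f(b)]      # objective values at the trial points
    for _ in range(n_trials - 2):
        # slopes over adjacent trial points
        slopes = [abs(fs[i + 1] - fs[i]) / (xs[i + 1] - xs[i])
                  for i in range(len(xs) - 1)]
        H = max(slopes)                                   # global estimate
        Xmax = max(xs[i + 1] - xs[i] for i in range(len(xs) - 1))

        def lipschitz_estimate(i):
            h = xs[i + 1] - xs[i]
            lam = max(slopes[max(0, i - 1): i + 2])       # local slopes
            gamma = H * h / Xmax                          # scaled global part
            return r * max(lam, gamma, xi)

        # characteristic of interval i: minimum of its piecewise-linear
        # lower bound; pick the interval with the smallest characteristic
        def characteristic(i):
            h = xs[i + 1] - xs[i]
            return (fs[i] + fs[i + 1]) / 2 - lipschitz_estimate(i) * h / 2

        i = min(range(len(xs) - 1), key=characteristic)

        # new trial at the minimizer of the lower bound on interval i
        L_i = lipschitz_estimate(i)
        x_new = (xs[i] + xs[i + 1]) / 2 - (fs[i + 1] - fs[i]) / (2 * L_i)
        j = bisect.bisect(xs, x_new)
        xs.insert(j, x_new)
        fs.insert(j, f(x_new))
    k = min(range(len(fs)), key=fs.__getitem__)
    return xs[k], fs[k]
```

Because each local estimate dominates the slope of its own interval (scaled by `r > 1`), the new trial point always falls strictly inside the selected subinterval; intervals over flat regions get small constants and are explored less, which is the source of the acceleration reported in the paper.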
Similar resources
Adaptive Tuning of Model Predictive Control Parameters based on Analytical Results
In dealing with model predictive controllers (MPC), controller tuning is a key design step. Various tuning methods proposed in the literature can be categorized as heuristic, numerical, and analytical methods. Among the available tuning methods, analytical approaches are the most interesting and useful. This paper is based on a proposed analytical MPC tuning approach for plants that can be appr...
Efficient Partition of N-Dimensional Intervals in the Framework of One-Point-Based Algorithms
In this paper, the problem of the minimal description of the structure of a vector function f (x) over an N-dimensional interval is studied. Methods adaptively subdividing the original interval in smaller subintervals and evaluating f (x) at only one point within each subinterval are considered. Two partition strategies traditionally used for solving this problem are analyzed. A new partition ...
A local approach to the entropy of countable fuzzy partitions
This paper defines and investigates the ergodic properties of the entropy of a countable partition of a fuzzy dynamical system at different points of the state space. It ultimately introduces the local fuzzy entropy of a fuzzy dynamical system and proves it to be an isomorphism invariant.
Practical Gauss-Newton Optimisation for Deep Learning
We present an efficient block-diagonal approximation to the Gauss-Newton matrix for feedforward neural networks. Our resulting algorithm is competitive against state-of-the-art first-order optimisation methods, with sometimes significant improvement in optimisation performance. Unlike first-order methods, for which hyperparameter tuning of the optimisation parameters is often a laborious proces...
A Differential Evolution and Spatial Distribution based Local Search for Training Fuzzy Wavelet Neural Network
Many parameter-tuning algorithms have been proposed for training Fuzzy Wavelet Neural Networks (FWNNs). Absence of an appropriate structure, convergence to local optima, and low speed in learning algorithms are deficiencies of FWNNs in previous studies. In this paper, a Memetic Algorithm (MA) is introduced to train FWNN, addressing the aforementioned learning deficiencies. Differential Evolution...
Journal: Numerische Mathematik
Volume: 94, Issue: -
Pages: -
Year of publication: 2003